Precise Error Analysis of Regularized M-estimators in High-dimensions

Authors

  • Christos Thrampoulidis
  • Ehsan Abbasi
  • Babak Hassibi
Abstract

A popular approach for estimating an unknown signal x0 ∈ Rⁿ from noisy, linear measurements y = Ax0 + z ∈ Rᵐ is via solving a so-called regularized M-estimator: x̂ := arg minₓ L(y−Ax) + λf(x). Here, L is a convex loss function, f is a convex (typically, non-smooth) regularizer, and λ > 0 is a regularizer parameter. We analyze the squared error performance ‖x̂ − x0‖² of such estimators in the high-dimensional proportional regime where m,n → ∞ and m/n → δ. The design matrix A is assumed to have entries iid Gaussian; only minimal and rather mild regularity conditions are imposed on the loss function, the regularizer, and on the noise and signal distributions. We show that the squared error converges in probability to a nontrivial limit that is given as the solution to a minimax convex-concave optimization problem on four scalar optimization variables. We identify a new summary parameter, termed the Expected Moreau envelope, which plays a central role in the error characterization. The precise nature of the results permits an accurate performance comparison between different instances of regularized M-estimators and allows one to optimally tune the involved parameters (e.g. regularizer parameter, number of measurements). The key ingredient of our proof is the Convex Gaussian Min-max Theorem (CGMT), which is a tight and strengthened version of a classical Gaussian comparison inequality that was proved by Gordon in 1988.
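The estimator in the abstract, x̂ := arg minₓ L(y−Ax) + λf(x), can be made concrete with a small numerical sketch. The following assumes the familiar Lasso instance — squared loss L(v) = ½‖v‖² and ℓ1 regularizer f(x) = ‖x‖₁ — with an iid Gaussian design, solved by plain proximal gradient iterations (ISTA). The dimensions, noise level, and choice of λ are illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Proportional regime: m measurements, n unknowns, delta = m/n.
n, m = 200, 120
k = 10  # sparsity of the true signal

# Ground truth and iid Gaussian design, as in the paper's setup.
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
z = 0.1 * rng.standard_normal(m)
y = A @ x0 + z

# Lasso instance of the regularized M-estimator:
#   min_x 0.5*||y - A x||^2 + lam*||x||_1
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const. of the gradient

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(2000):
    grad = A.T @ (A @ x - y)                        # gradient of the squared loss
    x = soft_threshold(x - step * grad, step * lam)  # proximal gradient step

squared_error = np.linalg.norm(x - x0) ** 2
print(squared_error)
```

The paper's contribution is to predict the limiting value of this squared error exactly (as a function of δ, λ, and the loss/regularizer pair) rather than to propose a solver; the sketch only shows the object being analyzed.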

Similar articles

Regularizing generalization error estimators: a novel approach to robust model selection

A well-known result by Stein shows that regularized estimators with small bias often yield better estimates than unbiased estimators. In this paper, we adapt this spirit to model selection, and propose regularizing unbiased generalization error estimators for stabilization. We trade a small bias in a model selection criterion against a larger variance reduction which has the beneficial effect o...


A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers

High-dimensional statistical inference deals with models in which the number of parameters p is comparable to or larger than the sample size n. Since it is usually impossible to obtain consistent procedures unless p/n → 0, a line of recent work has studied models with various types of low-dimensional structure, including sparse vectors, sparse and structured matrices, low-rank matrices, and...


An adaptive method for combined covariance estimation and classification

In this paper a family of adaptive covariance estimators is proposed to mitigate the problem of limited training samples for application to hyperspectral data analysis in quadratic maximum likelihood classification. These estimators are the combination of adaptive classification procedures and regularized covariance estimators. In these proposed estimators, the semi-labeled samples (whose label...


A Two-Phase Robust Estimation of Process Dispersion Using M-estimator

Parameter estimation is the first step in constructing any control chart. Most estimators of mean and dispersion are sensitive to the presence of outliers. The data may be contaminated by outliers either locally or globally. The existing robust estimators deal only with global contamination. In this paper a robust estimator for dispersion is proposed to reduce the effect of local contamination ...


Estimation with Norm Regularization

Analysis of non-asymptotic estimation error and structured statistical recovery based on norm regularized regression, such as Lasso, needs to consider four aspects: the norm, the loss function, the design matrix, and the noise model. This paper presents generalizations of such estimation error analysis on all four aspects compared to the existing literature. We characterize the restricted error...



Journal:
  • CoRR

Volume: abs/1601.06233

Pages: –

Publication year: 2016